10 research outputs found
Natural interaction framework for pedestrian navigation systems on mobile devices
Mobile Augmented Reality applications based on navigation frameworks try to promote interaction beyond the desktop by employing wearable sensors, which collect the user's position, orientation or diverse types of activities. Most navigation frameworks track the location and heading of the user in the global coordinate frame using Global Positioning System (GPS) data. On the other hand, researchers in the wearable computing area have studied angular data of human body segments in the local coordinate frame using inertial orientation trackers. We propose a combination of the global and local coordinate frame approaches and provide a context-aware interaction framework for mobile devices that seamlessly changes Graphical User Interfaces (GUIs) for pedestrians wandering in urban environments. The system is designed and tested on a Personal Digital Assistant (PDA) based handheld prototype mounted with a GPS receiver and an inertial orientation tracker. It introduces a method to estimate the orientation of a mobile user's hand. The recognition algorithm is based on state transitions triggered by time-line analysis of the pitch angle and angular velocity of the orientation tracker. The prototype system can differentiate between three postures successfully. We associated each posture with a different context of interest for pedestrian navigation systems: investigation, navigation and idle. Thus, we introduce the idea that once orientation trackers become part of mobile computers, they can be used to create natural interaction techniques with them. The prototype was tested successfully in two urban environments: the Sabanci University campus area and the 9th International Istanbul Biennial venues in Beyoglu, Istanbul.
Seamless interface transition mechanism for pedestrian navigation systems
This paper presents a posture recognition system for mobile computing devices to switch between different visualization modes seamlessly. It introduces a method to estimate the orientation of a mobile user's hand. The recognition algorithm is based on state transitions triggered by time-line analysis of the pitch angle and angular velocity of an orientation sensor. Currently the system can differentiate between three postures successfully. We associated each posture with a different context of interest for pedestrian navigation systems: investigation, navigation and idle. We implemented a prototype system with an orientation tracker and GPS receiver connected to a PDA running an OpenGL|ES application. The system allows users to navigate and investigate in a campus environment.
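The state-transition recognition described in the two abstracts above can be sketched as a small state machine over pitch angle and angular velocity. The thresholds and the low-angular-velocity stability test below are assumptions for illustration; the abstracts do not publish the actual values or transition rules.

```python
from enum import Enum

class Posture(Enum):
    IDLE = "idle"
    NAVIGATION = "navigation"
    INVESTIGATION = "investigation"

# Hypothetical thresholds (degrees, degrees per second); the papers
# do not give exact values, so these are illustrative only.
PITCH_NAV_MIN, PITCH_NAV_MAX = -30.0, 30.0  # device held roughly flat
PITCH_INV_MIN = 45.0                        # device raised toward the gaze
OMEGA_STABLE = 10.0                         # below this, the hand counts as still

def classify(pitch_deg, omega_deg_s, current):
    """One step of the posture state machine.

    A transition only fires when the hand is stable (low angular
    velocity); while the hand is moving, the current posture is kept."""
    if abs(omega_deg_s) > OMEGA_STABLE:
        return current  # hand in motion: no transition
    if pitch_deg >= PITCH_INV_MIN:
        return Posture.INVESTIGATION
    if PITCH_NAV_MIN <= pitch_deg <= PITCH_NAV_MAX:
        return Posture.NAVIGATION
    return Posture.IDLE
```

Feeding the tracker's pitch and angular-velocity samples through `classify` then yields the investigation/navigation/idle contexts that drive the GUI switch.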
Augmented reality based user interfaces to assist fieldwork on excavation sites
Archaeological site excavation is a destructive and irreversible process. According to
archaeologists, there is a clear need to visualize and analyze previously collected data
and completed work. Over the past years, researchers have developed virtual and augmented
reality (AR) applications for cultural heritage sites. The current applications are mainly
focused on AR context, where 3D virtual objects are integrated into the real environment
in real-time. They can be classified into two main categories: mobile tour guides and
reconstructive tools of remains. Although there are examples of excavation analyzers in
indoor augmented reality and 3D virtual reality contexts, there are no such applications
which offer real-time on site digital assistance using outdoor augmented reality.
In our project, we present an outdoor augmented reality tool with a user-friendly
interface to assist the fieldwork of archaeologists on excavation sites. Our prototype
consists of an ultra-mobile PC with an embedded camera, connected to orientation and
positioning sensors. It serves as a navigation and annotation tool.
We provide a user scenario to explain the workflow: an archaeologist wants to work on
a point of interest (POI) in the excavation site. He/she makes observations on the provided
vectorial map and gets information about the POIs from graphical interface widgets. Upon
selecting a POI to investigate further using the stylus of the mobile PC, the POI's detailed information
sheet appears. Besides this navigation feature, our tool allows the archaeologists to model the
remains of the port walls in real time once the AR context is activated: by lifting the mobile
PC to his/her gaze direction, the camera input starts and the AR interface presents the 2D
model of the POI. The user selects reference points of the wall on the video input, matching
them to the corresponding points of the 2D model, and completes the 3D modeling process. The
archaeologist can later work on the remains by marking and annotating the different strata on the video input.
Thus, a complete 3D model of the POI is created on site with layer information.
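The abstract does not specify how the selected reference points register the 2D model onto the video. One plausible realization, shown here purely as an illustration and not as the authors' published method, estimates a plane-to-image homography from the point correspondences via the direct linear transform (DLT):

```python
import numpy as np

def homography_dlt(model_pts, image_pts):
    """Estimate the 3x3 homography mapping 2D plan coordinates to image
    pixels from >= 4 point correspondences, using the DLT algorithm.

    Each correspondence (x, y) -> (u, v) contributes two rows to the
    linear system A h = 0; the solution is the null vector of A,
    recovered from the SVD."""
    A = []
    for (x, y), (u, v) in zip(model_pts, image_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1
```

With the homography in hand, any point of the 2D plan can be projected into the camera image, which is the kind of mapping the wall-modeling step needs.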
We are currently testing our prototype at the Yenikapi excavation site in Istanbul. This
area is considered the most exciting and important archaeological discovery in the history
of Istanbul, where archaeologists uncovered an ancient port of Constantinople with
perfectly preserved skeletons of ancient ships. Since 2005, a team of archaeologists, geologists
and authorities of the Istanbul Archaeology Museum has been working on the remains of the
fourth-century port, Portus Theodosiacus.
DEMO_107: Social Documentary: exploring together a collection of crowdsourced videos via a tangible interface